Sayyad, S. S.
- Radial Basis Function Network Construction Using Modified Gram Schmidt Algorithm
Authors
1 Department of CSE, Annasaheb Dange College of Engineering and Technology, Ashta 416301, IN
Source
Artificial Intelligent Systems and Machine Learning, Vol 8, No 8 (2016), Pagination: 279-285
Abstract
Neural networks provide a framework for self-learning systems, allowing users to build models around large amounts of dynamic, nonlinear data. Experts, practitioners, and developers in this domain are working to turn such systems into widely used autonomous systems. The Radial Basis Function Neural Network (RBFN) is a supervised learning network that has been widely applied to problems such as classification, signal processing, pattern recognition, and fault diagnosis. Constructing a neural network of any type is an important task: the network must be built so that it can adapt to the problem at hand and learn in a computationally efficient and compact manner. When building the initial RBFN model, the key issues are determining the hidden-layer parameters (centers and widths) and the output weights. This paper proposes an RBFN construction algorithm, designed for classification problems, that addresses these issues. The algorithm has four parts: the K-means algorithm, initial model construction, performance evaluation, and, based on the MSE value, optimization of the initially computed output weights using Particle Swarm Optimization. An optimized RBF network is then constructed from the new output weights, and the final part is the performance evaluation of the optimized model against the initial model. Both models are trained and tested on datasets from the UCI repository, with analysis carried out on the testing samples. The performance of the training and testing phases is measured by classification accuracy, number of misclassified samples, and computation time.
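The initial-model stage described in the abstract (K-means to place the hidden-layer centers, then output weights fitted on the Gaussian activations) can be sketched as below. This is a minimal illustration, not the paper's implementation: the function names, the single shared width, and the least-squares weight fit are assumptions, and the subsequent PSO refinement of the weights is omitted.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain K-means to pick the RBF centers (step 1 of the construction)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # assign each sample to its nearest center, then move centers to the mean
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                centers[j] = pts.mean(axis=0)
    return centers

def rbf_design_matrix(X, centers, width):
    """Gaussian hidden-layer activations: phi_ij = exp(-||x_i - c_j||^2 / (2 w^2))."""
    d2 = ((X[:, None] - centers) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_rbfn(X, y, k=4, width=1.0):
    """Initial model: centers from K-means, output weights by least squares
    (the paper refines these weights further with PSO based on the MSE)."""
    centers = kmeans(X, k)
    Phi = rbf_design_matrix(X, centers, width)
    W, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return centers, W

def predict(X, centers, W, width=1.0):
    return rbf_design_matrix(X, centers, width) @ W
```

With one center per training point the design matrix is square and near-diagonal, so the least-squares fit reproduces the targets almost exactly; with fewer centers than samples it gives the minimum-MSE output weights for the chosen centers and width.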
Keywords
Artificial Neural Network (ANN), K-Means, Mean Squared Error (MSE), Particle Swarm Optimization (PSO), Radial Basis Function Neural (RBFN) Network.
- Multilayer Perceptron Using Levenberg Marquardt Algorithm for Imbalanced Data
Authors
1 Annasaheb Dange College of Engineering & Technology, Ashta, Sangli, IN
Source
Programmable Device Circuits and Systems, Vol 10, No 5 (2018), Pagination: 85-88
Abstract
Supervised learning algorithms learn from a training dataset. If the dataset has an unequal distribution of samples among its classes, a classifier gives inaccurate results for the minority class and correct results for the majority class. This paper studies cost-sensitive learning of a Multilayer Perceptron (MLP) neural network to overcome this problem. The cost-sensitive approach, applied to binary datasets, modifies the objective function so that the network also produces accurate results for the minority class. The Levenberg-Marquardt (LM) weight-update algorithm is used for network learning; it trains neural networks faster and more efficiently than other weight-update algorithms. Statistical results on a real dataset show how fast training is with the LM algorithm, and that the MLP classifier produces accurate results for both the majority and minority classes.
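A minimal sketch of the two ideas in this abstract: minority-class errors are up-weighted in the squared-error objective (the cost-sensitive modification), and the weights are updated with the Levenberg-Marquardt rule w ← w − (JᵀJ + μI)⁻¹Jᵀe. A single sigmoid neuron stands in for the full MLP here, and the fixed damping μ and inverse-frequency class costs are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def lm_step(w, X, y, cost, mu=1.0):
    """One Levenberg-Marquardt update on a single sigmoid neuron with the
    cost-weighted squared-error objective E = 0.5 * sum(cost * (y - p)^2)."""
    p = 1.0 / (1.0 + np.exp(-np.clip(X @ w, -30, 30)))  # sigmoid output
    e = np.sqrt(cost) * (y - p)                         # weighted residuals
    # Jacobian of the residuals w.r.t. the weights: de/dw = -sqrt(cost)*p*(1-p)*x
    J = -(np.sqrt(cost) * p * (1 - p))[:, None] * X
    H = J.T @ J + mu * np.eye(len(w))                   # damped Gauss-Newton Hessian
    return w - np.linalg.solve(H, J.T @ e)

def class_costs(y):
    """Inverse-frequency costs: the minority class gets the larger weight,
    so its errors count more in the objective."""
    n, n1 = len(y), y.sum()
    return np.where(y == 1, n / (2.0 * n1), n / (2.0 * (n - n1)))
```

In a full LM trainer μ is adapted each iteration (decreased when the error drops, increased otherwise); keeping it fixed here keeps the sketch short while preserving the shape of the update.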
Keywords
Imbalanced Data, Cost Sensitive Learning, Multilayer Perceptron Neural Network, Levenberg-Marquardt Algorithm.
References
- Cristiano L. Castro, Antonio P. Braga, "Novel Cost-Sensitive Approach to Improve the Multilayer Perceptron Performance on Imbalanced Data," IEEE Transactions on Neural Networks and Learning Systems, vol. 24, no. 6, Jun. 2013.
- Foster Provost, "Machine Learning from Imbalanced Data Sets 101," New York University.
- Nathalie Japkowicz, Shaju Stephen, "The class imbalance problem: A systematic study," Intelligent Data Analysis, vol. 6, no. 5, Oct. 2002.
- G. M. Weiss, "Mining with rarity: A unifying framework," ACM SIGKDD Explorations Newsletter, vol. 6, no. 1, pp. 7-19, Jun. 2004.
- Zhi-Hua Zhou, Xu-Ying Liu, "Training Cost-Sensitive Neural Networks with Methods Addressing the Class Imbalance Problem," IEEE Transactions on Knowledge and Data Engineering.
- Cristiano Leite Castro, Antonio Padua Braga, "Artificial neural networks learning in ROC space," in Proc. Int. Joint Conf. Comput., 2009, pp. 484-489.
- R. Alejo, V. Garcia, J. M. Sotoca, R. A. Mollineda, and J. S. Sanchez, "Improving the Performance of the RBF Neural Networks Trained with Imbalanced Samples," in Computational and Ambient Intelligence (Lecture Notes in Computer Science), vol. 4507. New York, USA: Springer-Verlag, 2007, pp. 162-169.
- S. H. Oh, "Error back-propagation algorithm for classification of imbalanced data," Neurocomputing, vol. 74, no. 6, pp. 1058-1061, Feb. 2011.
- Hao Yu, Bogdan M. Wilamowski, "Levenberg-Marquardt Training," Auburn University.
- M. T. Hagan, M. B. Menhaj, "Training feedforward networks with the Marquardt algorithm," IEEE Trans. Neural Netw., vol. 5, no. 6, pp. 989-993, Nov. 1994.
- Bogdan M. Wilamowski, Hao Yu, "Improved Computation for Levenberg-Marquardt Training," IEEE Transactions on Neural Networks, vol. 21, no. 6, Jun. 2010.
- A. Frank and A. Asuncion, UCI Machine Learning Repository, 2010. [Online]. Available: http://archive.ics.uci.edu/ml
- Shinde, S. B., Sayyad, S. S.: Cost Sensitive Improved Levenberg Marquardt Algorithm for Imbalanced Data, Computational Intelligence and Computing Research (ICCIC), IEEE International Conference on IEEE, pp. 1-4, (2016).
- Shinde, S. B., Sayyad, S. S., Mulla, A. N.: Multilayer Perceptron Using Modified Levenberg Marquardt Algorithm for Imbalanced Data, Inventi Impact: Soft Computing, vol. 4, pp. 209-215, (2015).